A Simple Algorithm for Maximum Margin Classification, Revisited
Abstract
Let P be a set of n points in R^d. Every point has a label/color (say black or white), but we do not know the labels. In particular, let B and W be the sets of black and white points in P. Furthermore, let ∆ = diam(P), and assume that there exist two parallel hyperplanes h, h′ at distance γ from each other, such that the slab between h and h′ does not contain any point of P, the points of B are on one side of this slab, and the points of W are on the other side. The quantity γ is the margin of P. A somewhat more convenient way to handle such slabs is to consider two points b and w in R^d. Let slab(b,w) be the region of points in R^d whose projection onto the line spanned by b and w is contained in the open segment bw. We use (1 − ε)slab(b,w) to denote the slab formed from slab(b,w) by shrinking it by a factor of (1 − ε) around its middle hyperplane. Formally, it is defined as (1 − ε)slab(b,w) = slab(b′,w′), where b′ = (1 − ε/2)b + (ε/2)w and w′ = (ε/2)b + (1 − ε/2)w. In the following, we assume access to a labeling oracle that can return the label of a specific query point. Similarly, we assume access to a counterexample oracle, such that given a slab that does not contain any points of P in its interior, and supposedly separates the points of P into B and W, it returns a point that is mislabeled by this classifier (i.e., slab) if such a point exists. Conceptually, queries to the oracles are quite expensive, and the algorithm tries to minimize the number of such queries.
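To make the definitions above concrete, the following is a minimal sketch of the slab primitives and the two oracle interfaces as described in the abstract. The names shrink_slab, in_slab, LabelingOracle, and CounterexampleOracle are hypothetical, and the convention that the points of B lie on the b side of the slab (and W on the w side) is an assumption of this sketch, not something fixed by the abstract.

```python
import numpy as np

def shrink_slab(b, w, eps):
    """(1 - eps) * slab(b, w): shrink the slab by a factor (1 - eps)
    around its middle hyperplane, following the definition above."""
    b, w = np.asarray(b, dtype=float), np.asarray(w, dtype=float)
    b_new = (1 - eps / 2) * b + (eps / 2) * w
    w_new = (eps / 2) * b + (1 - eps / 2) * w
    return b_new, w_new

def in_slab(p, b, w):
    """True if the projection of p onto the line spanned by b and w
    falls strictly inside the open segment bw."""
    d = w - b
    t = np.dot(p - b, d) / np.dot(d, d)   # projection parameter along bw
    return 0.0 < t < 1.0

class LabelingOracle:
    """Returns the hidden label ('B' or 'W') of a query point."""
    def __init__(self, labels):
        self.labels = labels               # dict: point index -> 'B' / 'W'
    def label(self, i):
        return self.labels[i]

class CounterexampleOracle:
    """Given a candidate separating slab (b, w), return the index of a point
    of P that the slab misclassifies, or None if the slab is consistent.
    Convention (an assumption): B is on the b side, W on the w side."""
    def __init__(self, points, labels):
        self.points, self.labels = points, labels
    def counterexample(self, b, w):
        d = w - b
        for i, p in enumerate(self.points):
            t = np.dot(p - b, d) / np.dot(d, d)
            side = 'B' if t <= 0.0 else ('W' if t >= 1.0 else None)
            if side != self.labels[i]:     # inside the slab or on the wrong side
                return i
        return None
```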
Similar resources
SoftDoubleMinOver: A Simple Procedure for Maximum Margin Classification
The well-known MinOver algorithm is a simple modification of the perceptron algorithm and provides the maximum margin classifier, without a bias, in linearly separable two-class classification problems. DoubleMinOver, a slight modification of MinOver that now includes a bias, is introduced. It is shown how this simple and iterative procedure can be extended to SoftDoubleMinOver for classificat...
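For intuition, here is a minimal, illustrative sketch of the classical MinOver update that this entry refers to (not the SoftDoubleMinOver procedure itself). The function name minover, the fixed iteration count, and the NumPy input format (X as an n×d array, y as ±1 labels) are assumptions of this sketch.

```python
import numpy as np

def minover(X, y, iterations=10000):
    """Classical MinOver: at every step, pick the training example with the
    smallest margin y_i * <w, x_i> and apply a perceptron update with it.
    For linearly separable data the direction of w approaches the maximum
    margin separator (without a bias term, as noted above)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iterations):
        margins = y * (X @ w)          # signed overlaps of all examples
        i = int(np.argmin(margins))    # the "minimum overlap" example
        w += y[i] * X[i]               # perceptron-style reinforcement
    return w
```

A new point x would then be classified by the sign of np.dot(w, x); DoubleMinOver, as described above, roughly extends this by handling the two classes' minimum-margin examples separately so that a bias can be recovered.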
Maximum Margin Coresets for Active and Noise Tolerant Learning
We study the problem of learning large margin halfspaces in various settings using coresets and show that coresets are a widely applicable tool for large margin learning. A large margin coreset is a subset of the input data sufficient for approximating the true maximum margin solution. In this work, we provide a direct algorithm and analysis for constructing large margin coresets. We show vario...
Simple Incremental One-Class Support Vector Classification
We introduce the OneClassMaxMinOver (OMMO) algorithm for the problem of one-class support vector classification. The algorithm is extremely simple and therefore a convenient choice for practitioners. We prove that in the hard-margin case the algorithm converges with O(1/√t) to the maximum margin solution of the support vector approach for one-class classification introduced by Schölkopf et al...
Presentation of quasi-linear piecewise selected models simultaneously with designing of bump-less optimal robust controller for nonlinear vibration control of composite plates
The idea of using quasi-linear piecewise models is based on decomposing complicated nonlinear systems and simultaneously designing local controllers for them. Since proper performance and closed-loop stability of the final system are vital when designing multi-model controllers, the main problem in multi-model controllers is the number of local models and their position, not payi...
Maximum Relative Margin and Data-Dependent Regularization
Leading classification methods such as support vector machines (SVMs) and their counterparts achieve strong generalization performance by maximizing the margin of separation between data classes. While the maximum margin approach has achieved promising performance, this article identifies its sensitivity to affine transformations of the data and to directions with large data spread. Maximum mar...
MaxMinOver Regression: A Simple Incremental Approach for Support Vector Function Approximation
The well-known MinOver algorithm is a simple modification of the perceptron algorithm and provides the maximum margin classifier without a bias in linearly separable two class classification problems. In [1] and [2] we presented DoubleMinOver and MaxMinOver as extensions of MinOver which provide the maximal margin solution in the primal and the Support Vector solution in the dual formulation by...
Journal: CoRR
Volume: abs/1507.01563
Publication date: 2015